
[SPARK-55995][SQL] Support TIMESTAMP WITH LOCAL TIME ZONE in SQL syntax#54813

Closed
pan3793 wants to merge 1 commit into apache:master from pan3793:SPARK-55995

Conversation

@pan3793
Member

@pan3793 pan3793 commented Mar 15, 2026

What changes were proposed in this pull request?

This PR proposes to add SQL syntax TIMESTAMP WITH LOCAL TIME ZONE for TimestampType, as a counterpart of TIMESTAMP WITHOUT TIME ZONE for TimestampNTZType.

Why are the changes needed?

Spark SQL supports TIMESTAMP_LTZ as a counterpart of TIMESTAMP_NTZ, but lacks support for
TIMESTAMP WITH LOCAL TIME ZONE as a counterpart of TIMESTAMP WITHOUT TIME ZONE.

-- fine
CREATE TABLE ts_1(ltz TIMESTAMP, ntz TIMESTAMP_NTZ);

-- fine, equivalent to the above case
CREATE TABLE ts_2(ltz TIMESTAMP, ntz TIMESTAMP WITHOUT TIME ZONE);

-- set TIMESTAMP as alias of TIMESTAMP_NTZ
SET spark.sql.timestampType=TimestampNTZ;

-- fine, both are TimestampNTZ
CREATE TABLE ts_3(ltz TIMESTAMP, ntz TIMESTAMP_NTZ);

-- fine, one TimestampLTZ and one TimestampNTZ
CREATE TABLE ts_4(ltz TIMESTAMP_LTZ, ntz TIMESTAMP_NTZ);

-- this does not work
CREATE TABLE ts_5(ltz TIMESTAMP WITH LOCAL TIME ZONE, ntz TIMESTAMP WITHOUT TIME ZONE);
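The alias table the examples above rely on can be sketched in plain Python (illustrative only, not Spark's actual parser; the function name and the default value of spark.sql.timestampType shown here are assumptions for the sketch):

```python
# Sketch of how the SQL type spellings map onto Spark's two timestamp types
# once this PR is in. "TIMESTAMP" alone follows spark.sql.timestampType.
def resolve_timestamp_spelling(spelling: str, default_timestamp: str = "TimestampLTZ") -> str:
    aliases = {
        "TIMESTAMP_LTZ": "TimestampLTZ",
        "TIMESTAMP WITH LOCAL TIME ZONE": "TimestampLTZ",  # syntax added by this PR
        "TIMESTAMP_NTZ": "TimestampNTZ",
        "TIMESTAMP WITHOUT TIME ZONE": "TimestampNTZ",
        "TIMESTAMP": default_timestamp,  # controlled by spark.sql.timestampType
    }
    return aliases[spelling.upper()]

print(resolve_timestamp_spelling("TIMESTAMP"))                       # TimestampLTZ
print(resolve_timestamp_spelling("TIMESTAMP", "TimestampNTZ"))       # TimestampNTZ
print(resolve_timestamp_spelling("TIMESTAMP WITH LOCAL TIME ZONE"))  # TimestampLTZ
```

With this mapping, the failing ts_5 example resolves exactly like ts_4: one TimestampLTZ column and one TimestampNTZ column, regardless of the spark.sql.timestampType setting.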

Does this PR introduce any user-facing change?

This introduces new SQL syntax without breaking backward compatibility.

How was this patch tested?

New UTs are added.

Was this patch authored or co-authored using generative AI tooling?

No.

@pan3793 pan3793 requested a review from gengliangwang March 15, 2026 19:57
@pan3793
Member Author

pan3793 commented Mar 17, 2026

@gengliangwang could you please take a look, as you are the author of the TIMESTAMP_NTZ feature?

Member

@gengliangwang gengliangwang left a comment


LGTM

Member

@dongjoon-hyun dongjoon-hyun left a comment


+1 for adding this feature, but we need to clarify why we chose this syntax.

For example, could you summarize the other DBMSes' syntaxes in the PR description, @pan3793?

AFAIK, this seems to follow the Oracle way, but Apache Spark traditionally prefers the PostgreSQL way. In PostgreSQL, it's TIMESTAMP WITH TIME ZONE; in MySQL, plain TIMESTAMP already behaves this way.

cc @cloud-fan , @yaooqinn , @LuciferYang , @peter-toth , too.

@gengliangwang
Member

AFAIK, this seems to follow the Oracle way, but Apache Spark traditionally prefers the PostgreSQL way. In PostgreSQL, it's TIMESTAMP WITH TIME ZONE; in MySQL, plain TIMESTAMP already behaves this way.

Spark's default timestamp type behaves similarly to TIMESTAMP WITH LOCAL TIME ZONE. This differs from TIMESTAMP WITH TIME ZONE, which explicitly stores both the timestamp and the time zone in a single value.
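The distinction can be shown with a stdlib-Python sketch, independent of Spark (the dates and zone names are illustrative): a WITH LOCAL TIME ZONE value stores only an instant and is rendered in whatever the session time zone is, while a WITH TIME ZONE value carries its own zone.

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# WITH LOCAL TIME ZONE semantics: only the instant is stored (here, in UTC)...
instant = datetime(2026, 3, 15, 12, 0, tzinfo=timezone.utc)

# ...and display depends on the "session" time zone chosen at read time.
print(instant.astimezone(ZoneInfo("America/Los_Angeles")))  # 2026-03-15 05:00:00-07:00
print(instant.astimezone(ZoneInfo("Asia/Shanghai")))        # 2026-03-15 20:00:00+08:00

# WITH TIME ZONE semantics: the zone is part of the stored value itself.
with_tz = datetime(2026, 3, 15, 12, 0, tzinfo=ZoneInfo("Asia/Shanghai"))
print(with_tz)  # 2026-03-15 12:00:00+08:00
```

Spark's TimestampType matches the first behavior: one stored instant, many possible renderings.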

@pan3793
Member Author

pan3793 commented Mar 17, 2026

@dongjoon-hyun Yes, as Gengliang said, TIMESTAMP WITH TIME ZONE and TIMESTAMP WITH LOCAL TIME ZONE are two different concepts: https://docs.oracle.com/en/database/oracle/oracle-database/21/nlspg/datetime-data-types-and-time-zone-support.html

AFAIK, PostgreSQL does not support TIMESTAMP WITH LOCAL TIME ZONE, while MySQL's TIMESTAMP is equivalent to TIMESTAMP WITH LOCAL TIME ZONE and MySQL does not support TIMESTAMP WITH TIME ZONE, so this won't introduce confusion.

Additionally, Snowflake and Flink support both TIMESTAMP WITH TIME ZONE and TIMESTAMP WITH LOCAL TIME ZONE, as well as TIMESTAMP_TZ and TIMESTAMP_LTZ, so it's actually a common practice.

https://docs.snowflake.com/en/sql-reference/data-types-datetime#timestamp-ltz-timestamp-ntz-timestamp-tz

https://nightlies.apache.org/flink/flink-docs-release-2.2/docs/dev/table/types/#timestamp_ltz

@dongjoon-hyun
Member

Thank you for the explanation, @gengliangwang and @pan3793 .

@dongjoon-hyun
Member

Merged to master for Apache Spark 4.2.0.

@LuciferYang
Contributor

late LGTM

@dongjoon-hyun
Member

Just FYI, the master branch commit builder has been broken today due to a dev/spark-test-image/lint/Dockerfile build failure caused by R packages. It's unrelated to this PR; I'm looking into the failure independently.

@dongjoon-hyun
Member

For the record, the following PR aims to recover the master branch CI.

@pan3793
Member Author

pan3793 commented Mar 17, 2026

@dongjoon-hyun, thank you for merging and for taking care of the CI failure.
